Learning Bayesian Networks is NP-Hard
Authors
Abstract
Algorithms for learning Bayesian networks from data have two components: a scoring metric and a search procedure. The scoring metric computes a score reflecting the goodness-of-fit of the structure to the data. The search procedure tries to identify network structures with high scores. Heckerman et al. (1994) introduced a Bayesian metric, called the BDe metric, that computes the relative posterior probability of a network structure given data. They show that the metric has a property desirable for inferring causal structure from data. In this paper, we show that the problem of deciding whether there is a Bayesian network, among those where each node has at most k parents, that has a relative posterior probability greater than a given constant is NP-complete when the BDe metric is used.
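Stated as a decision problem, the hardness result can be sketched as follows; this is a paraphrase of the abstract, and the symbols $D$ (the data), $k$ (the parent bound), and $p$ (the threshold constant) are introduced here for illustration only:

\[
\text{Given } D,\ k,\ p:\quad \exists\, \text{network structure } B_S \text{ with at most } k \text{ parents per node such that } P(B_S \mid D) > p\,?
\]

The paper's result is that this decision problem is NP-complete when the relative posterior $P(B_S \mid D)$ is computed with the BDe metric.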
Similar References
Tractable Bayesian Network Structure Learning with Bounded Vertex Cover Number
Both learning and inference tasks on Bayesian networks are NP-hard in general. Bounded tree-width Bayesian networks have recently received a lot of attention as a way to circumvent this complexity issue; however, while inference on bounded tree-width networks is tractable, the learning problem remains NP-hard even for tree-width 2. In this paper, we propose bounded vertex cover number Bayesian ...
Learning Optimal Bounded Treewidth Bayesian Networks via Maximum Satisfiability
Bayesian network structure learning is the well-known computationally hard problem of finding a directed acyclic graph structure that optimally describes given data. A learned structure can then be used for probabilistic inference. While exact inference in Bayesian networks is in general NP-hard, it is tractable in networks with low treewidth. This provides good motivation for developing algor...
Learning Bayesian Networks From Dependency Networks: A Preliminary Study
In this paper we describe how to learn Bayesian networks from a summary of complete data in the form of a dependency network rather than from data directly. This method allows us to gain the advantages of both representations: scalable algorithms for learning dependency networks and convenient inference with Bayesian networks. Our approach is to use a dependency network as an “oracle” for the s...
Exact Learning of Bounded Tree-width Bayesian Networks
Inference in Bayesian networks is known to be NP-hard, but if the network has bounded treewidth, then inference becomes tractable. Not surprisingly, learning networks that closely match the given data and have a bounded tree-width has recently attracted some attention. In this paper we aim to lay groundwork for future research on the topic by studying the exact complexity of this problem. We gi...
Learning Sparse Causal Models is not NP-hard
This paper shows that causal model discovery is not an NP-hard problem, in the sense that for sparse graphs bounded by node degree k the sound and complete causal model can be obtained in worst case order N independence tests, even when latent variables and selection bias may be present. We present a modification of the well-known FCI algorithm that implements the method for an independence ora...
Bayesian Networks Structure Learning Second Exam Literature Review
Bayesian networks are widely used graphical models which represent uncertainty relations between the random variables in a domain compactly and intuitively. The first step of applying Bayesian networks to real-world problems typically requires building the structure of the networks. Among others, score-based exact structure learning has become an active research topic in recent years. In this co...